Gaussian mixtures: entropy and geometric inequalities

Authors

  • Alexandros Eskenazis
  • Piotr Nayar
  • Tomasz Tkocz
Abstract

A symmetric random variable is called a Gaussian mixture if it has the same distribution as the product of two independent random variables, one being positive and the other a standard Gaussian random variable. Examples of Gaussian mixtures include random variables with densities proportional to e^{-|t|^p} and symmetric p-stable random variables, where p ∈ (0, 2]. We obtain various sharp moment and entropy comparison estimates for weighted sums of independent Gaussian mixtures and investigate extensions of the B-inequality and the Gaussian correlation inequality in the context of Gaussian mixtures. We also obtain a correlation inequality for symmetric geodesically convex sets in the unit sphere equipped with the normalized surface area measure. We then apply these results to derive sharp constants in Khintchine inequalities for vectors uniformly distributed on the unit balls with respect to p-norms and provide short proofs of new and old comparison estimates for geometric parameters of sections and projections of such balls.
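
As a concrete illustration of the definition (a minimal numerical sketch of our own, not taken from the paper): for p = 1 the density proportional to e^{-|t|} is the standard Laplace density, and the Laplace law is a Gaussian mixture with positive factor √(2E), where E is a standard exponential random variable. All names and parameters below are illustrative.

```python
# Sketch of ours, not from the paper: the p = 1 case of the definition above.
# If E ~ Exp(1) and G ~ N(0, 1) are independent, then X = sqrt(2E) * G has
# the standard Laplace density (1/2) * exp(-|t|).
import numpy as np
from scipy.special import gamma

rng = np.random.default_rng(0)
n = 1_000_000
E = rng.exponential(scale=1.0, size=n)   # positive mixing factor (squared, up to 2)
G = rng.standard_normal(n)               # standard Gaussian factor
X = np.sqrt(2.0 * E) * G                 # Gaussian mixture representation

# The standard Laplace law has absolute moments E|X|^q = Gamma(q + 1);
# compare them with the empirical moments of X.
for q in (0.5, 1.0, 2.0, 3.0):
    print(q, np.mean(np.abs(X) ** q), gamma(q + 1.0))
```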


Similar articles

Guaranteed Bounds on Information-Theoretic Measures of Univariate Mixtures Using Piecewise Log-Sum-Exp Inequalities

Information-theoretic measures such as the entropy, the cross-entropy and the Kullback-Leibler divergence between two mixture models are core primitives in many signal processing tasks. Since the Kullback-Leibler divergence between mixtures provably does not admit a closed-form formula, in practice it is either estimated using costly Monte-Carlo stochastic integration, approximated, or bounded using vari...
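
As context for the Monte-Carlo baseline mentioned above, here is a hedged sketch (a toy setup of our own, not the cited paper's code) of stochastic-integration estimation of KL(p‖q) for two univariate Gaussian mixtures, using a log-sum-exp evaluation of the mixture log-densities:

```python
# Plain Monte-Carlo estimation of KL(p || q) for two Gaussian mixtures,
# the costly baseline that piecewise log-sum-exp bounds aim to replace.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)

def mixture_logpdf(x, weights, means, sigmas):
    # Stable log p(x) via a log-sum-exp over the mixture components.
    comp = np.log(weights) + norm.logpdf(x[:, None], means, sigmas)
    return np.logaddexp.reduce(comp, axis=1)

# Two toy two-component mixtures p and q (illustrative parameters).
wp, mu_p, s_p = np.array([0.3, 0.7]), np.array([-1.0, 2.0]), np.array([0.5, 1.0])
wq, mu_q, s_q = np.array([0.5, 0.5]), np.array([0.0, 1.0]), np.array([1.0, 1.5])

# Sample from p: choose a component, then draw from it.
n = 200_000
idx = rng.choice(2, size=n, p=wp)
x = rng.normal(mu_p[idx], s_p[idx])

# KL(p || q) = E_p[log p(X) - log q(X)], estimated by the sample mean.
kl = np.mean(mixture_logpdf(x, wp, mu_p, s_p) - mixture_logpdf(x, wq, mu_q, s_q))
print(f"Monte-Carlo estimate of KL(p || q): {kl:.4f}")
```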


Cramér-Rao and moment-entropy inequalities for Rényi entropy and generalized Fisher information

The moment-entropy inequality shows that a continuous random variable with given second moment and maximal Shannon entropy must be Gaussian. Stam's inequality shows that a continuous random variable with given Fisher information and minimal Shannon entropy must also be Gaussian. The Cramér-Rao inequality is a direct consequence of these two inequalities. In this paper the inequalities above are ...
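
A quick numerical illustration of the moment-entropy inequality (our own check, using standard closed-form entropies): at a fixed second moment, the Gaussian entropy exceeds, for example, the Laplace entropy.

```python
# Our own check, using standard closed forms: at equal second moment, the
# Gaussian has larger Shannon entropy than the Laplace law.
#   h(N(0, s2))   = 0.5 * ln(2 * pi * e * s2)
#   h(Laplace(b)) = 1 + ln(2 * b), with second moment 2 * b**2
import numpy as np

s2 = 1.0                       # common second moment
h_gauss = 0.5 * np.log(2.0 * np.pi * np.e * s2)
b = np.sqrt(s2 / 2.0)          # Laplace scale matching the second moment
h_laplace = 1.0 + np.log(2.0 * b)
print(h_gauss, h_laplace, h_gauss > h_laplace)   # ~1.419, ~1.347, True
```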


Weighted Gaussian entropy and determinant inequalities

We produce a series of results extending information-theoretic inequalities (discussed by Dembo–Cover–Thomas in 1989–1991) to a weighted version of entropy. The resulting inequalities involve the Gaussian weighted entropy; they imply a number of new relations for determinants of positive-definite matrices.
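
For orientation, here is a classical unweighted instance of this entropy-determinant connection (a sketch of ours, not the weighted extension above): subadditivity of Gaussian entropy, h(X) ≤ Σᵢ h(Xᵢ), is equivalent to Hadamard's inequality det(Σ) ≤ Πᵢ Σᵢᵢ for positive-definite covariance matrices.

```python
# Numerical check of Hadamard's inequality, the determinant form of
# Gaussian entropy subadditivity (classical, not the weighted version).
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 4))
S = A @ A.T + 4.0 * np.eye(4)        # random positive-definite covariance
lhs = np.linalg.det(S)               # det(S)
rhs = np.prod(np.diag(S))            # product of diagonal entries
print(lhs, rhs, lhs <= rhs)          # Hadamard: True
```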


On w-mixtures: Finite convex combinations of prescribed component distributions

We consider the space of w-mixtures, that is, the set of finite statistical mixtures sharing the same prescribed component distributions. The geometry induced by the Kullback-Leibler (KL) divergence on this family of w-mixtures is a dually flat space in information geometry, called the mixture family manifold. It follows that the KL divergence between two w-mixtures is equivalent to a Bregman Div...
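
A numerical sanity check of this equivalence (our own construction, with arbitrary Gaussian components): for a one-parameter mixture family m_t = (1 − t) f₀ + t f₁, the KL divergence KL(m_a ‖ m_b) matches the Bregman divergence generated by F(t) = −H(m_t), the negative Shannon entropy of the mixture.

```python
# Check KL(m_a || m_b) == Bregman_F(a : b) with F(t) = -H(m_t),
# for a one-parameter family of mixtures of two fixed Gaussians.
import numpy as np
from scipy.stats import norm

x = np.linspace(-15.0, 15.0, 30_001)
dx = x[1] - x[0]
f0 = norm.pdf(x, -1.0, 1.0)
f1 = norm.pdf(x, 2.0, 1.5)

def m(t):                      # mixture with weight vector (1 - t, t)
    return (1.0 - t) * f0 + t * f1

def F(t):                      # negative Shannon entropy of m_t
    p = m(t)
    return np.sum(p * np.log(p)) * dx

a, b, eps = 0.3, 0.6, 1e-5
kl = np.sum(m(a) * (np.log(m(a)) - np.log(m(b)))) * dx
dFb = (F(b + eps) - F(b - eps)) / (2.0 * eps)    # central difference F'(b)
bregman = F(a) - F(b) - dFb * (a - b)
print(kl, bregman)             # agree up to discretization error
```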


The Rate of Entropy for Gaussian Processes

In this paper, we show that in order to obtain the Tsallis entropy rate for stochastic processes, we can use the limit of conditional entropy, as was done for the Shannon and Rényi entropy rates. Using this, we obtain the Tsallis entropy rate for stationary Gaussian processes. Finally, we derive the relation between the Rényi, Shannon and Tsallis entropy rates for stationary Gaussian proc...
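
To make the conditional-entropy route concrete in the Shannon case (a sketch under our own AR(1) assumption, not the paper's derivation): for a stationary Gaussian AR(1) process the limit of conditional entropies agrees with the Kolmogorov–Szegő spectral formula for the entropy rate.

```python
# Shannon entropy rate of a stationary Gaussian AR(1) process
# X_t = phi * X_{t-1} + eps_t, eps_t ~ N(0, s2), two ways.
import numpy as np

phi, s2 = 0.6, 1.0

# Route 1: limit of conditional entropies, h(X_t | past) = 0.5*ln(2*pi*e*s2).
h_cond = 0.5 * np.log(2.0 * np.pi * np.e * s2)

# Route 2: Kolmogorov-Szego spectral formula,
# h = 0.5*ln(2*pi*e) + (1/(4*pi)) * integral_{-pi}^{pi} ln S(w) dw,
# with AR(1) spectral density S(w) = s2 / |1 - phi * exp(-i*w)|^2.
w = np.linspace(-np.pi, np.pi, 200_001)
S = s2 / np.abs(1.0 - phi * np.exp(-1j * w)) ** 2
h_spec = 0.5 * np.log(2.0 * np.pi * np.e) + np.sum(np.log(S)) * (w[1] - w[0]) / (4.0 * np.pi)
print(h_cond, h_spec)          # the two values agree
```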



Journal:
  • CoRR

Volume: abs/1611.04921

Publication date: 2016